Ahnlich Client PY
A Python client that interacts with both ahnlich DB and AI
Usage Overview
The following topics are covered:
Installation
- Using Poetry
poetry add ahnlich-client-py
- Using pip
pip3 install ahnlich-client-py
Package Information
The ahnlich client has a few noteworthy modules that provide useful context:
- Bincode
- Serde Types
- Serde Binary
The modules above contain classes generated by serde_generate; they represent the primitive Rust types and provide base bincode serialization capabilities.
- Query: Generated from the spec document; contains all the types used to send a request to the ahnlich database
- Server Response: Generated from the spec document; contains all the possible server responses
- Builders: Request builders used to construct queries and pipeline multiple requests
- Exceptions: Possible client exceptions
- Libs: Contains helpers, such as create_store_key
Server Response
All query types have an associated server response, all of which can be found in:
from ahnlich_client_py import server_response
Initialization
Client
- Blocking clients
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
- Nonblocking clients
from ahnlich_client_py.non_blocking_client import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
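A minimal sketch of using the non-blocking client, assuming its methods mirror the blocking client as coroutines that must be awaited inside an asyncio event loop:
import asyncio

from ahnlich_client_py.non_blocking_client import AhnlichDBClient

async def main():
    # assumption: the non-blocking client exposes the same methods as coroutines
    client = AhnlichDBClient(address="127.0.0.1", port=1369)  # use your ahnlich-db port
    response = await client.ping()
    print(response)

asyncio.run(main())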
Connection Pooling
The ahnlich client can reuse connections. The configuration can be changed by overriding the defaults at client initialization.
@dataclass
class AhnlichDBPoolSettings:
    idle_timeout: float = 30.0
    max_lifetime: float = 600.0
    min_idle_connections: int = 3
    max_pool_size: int = 10
    enable_background_collector: bool = True
    dispose_batch_size: int = 0
Where:
- enable_background_collector -> defaults to True: if True, starts a background worker that disposes of expired and idle connections to maintain the requested pool state. If False, connections are disposed of on each connection release.
- idle_timeout -> defaults to 30.0: inactivity time (in seconds) after which an extra connection is disposed of (a connection is considered extra if the number of endpoint connections exceeds min_idle_connections).
- max_lifetime -> defaults to 600.0: number of seconds after which any connection is disposed of.
- min_idle_connections -> defaults to 3: minimum number of connections to the ahnlich db endpoint that the pool tries to hold. Connections exceeding that number are considered extra and are disposed of after idle_timeout seconds of inactivity.
- max_pool_size -> defaults to 10: maximum number of connections in the pool.
- dispose_batch_size -> defaults to 0: maximum number of expired and idle connections to dispose of on connection release (ignored when the background collector is running).
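A hedged sketch of overriding these defaults. The import path for AhnlichDBPoolSettings and the pool_settings keyword below are assumptions; check the installed package for the exact names.
from ahnlich_client_py import AhnlichDBClient
from ahnlich_client_py.config import AhnlichDBPoolSettings  # assumed module path

settings = AhnlichDBPoolSettings(
    idle_timeout=60.0,       # keep extra idle connections for a minute
    max_lifetime=1200.0,     # recycle every connection after 20 minutes
    min_idle_connections=5,
    max_pool_size=20,
)

# assumed keyword name; verify against the client's constructor signature
client = AhnlichDBClient(address="127.0.0.1", port=1369, pool_settings=settings)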
Requests - DB
Ping
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.ping(tracing_id)
Info Server
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.info_server()
List Connected Clients
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.list_clients()
List Stores
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.list_stores(tracing_id)
Create Store
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.create_store(
store_name = "test store",
dimension = 5,
create_predicates = [
"job"
],
error_if_exists=True
)
Once the store dimension is fixed, all store keys must conform to that dimension. Note that only one-dimensional arrays/vectors of length N are accepted, where N is the store dimension.
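A minimal sanity-check sketch: the vector passed to create_store_key must contain exactly as many elements as the dimension given at store creation (5 in the example above).
from ahnlich_client_py.libs import create_store_key

DIMENSION = 5  # must match the dimension passed to create_store
vector = [5.0, 3.0, 4.0, 3.9, 4.9]

# guard against sending a key of the wrong length
assert len(vector) == DIMENSION, "store keys must match the store dimension"
store_key = create_store_key(data=vector)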
Set
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py.internals import db_query as query
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
store_key = create_store_key(data=[5.0, 3.0, 4.0, 3.9, 4.9])
store_value = {"rank": query.MetadataValue__RawString(value="chunin")}
response = client.set(
store_name = "test store",
inputs=[(store_key, store_value)]
)
Drop store
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.drop_store(
store_name = "test store",
error_if_not_exists=True
)
Get Sim N
Returns an array of (store_key, store_value) tuples, up to the specified maximum N.
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py.internals import db_query as query
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
key = create_store_key(data=[5.0, 3.0, 4.0, 3.9, 4.9])
response = client.get_sim_n(
store_name = "test store",
search_input = key,
closest_n = 3,
algorithm = query.Algorithm__CosineSimilarity(),
condition = None,
tracing_id=None,
)
closest_n must be a nonzero integer.
Get Key
Returns an array of tuple of (store_key, store_value)
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
key = create_store_key(data=[5.0, 3.0, 4.0, 3.9, 4.9])
response = client.get_key(
store_name = "test store",
keys=[key]
)
Get By Predicate
Same as get_key, but returns results based on defined conditions.
from ahnlich_client_py.internals import db_query as query
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
condition = query.PredicateCondition__Value(
query.Predicate__Equals(
key="job",
value=query.MetadataValue__RawString(value="sorcerer")
)
)
response = client.get_by_predicate(
store_name = "test store",
condition=condition
)
Create Predicate Index
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.create_pred_index(
store_name = "test store",
predicates=["job", "rank"]
)
Drop Predicate Index
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.drop_pred_index(
store_name = "test store",
predicates=["job"],
error_if_not_exists=True
)
Create Non Linear Algorithm Index
from ahnlich_client_py.internals import db_query as query
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.create_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[query.NonLinearAlgorithm__KDTree()],
tracing_id = None
)
Drop Non Linear Algorithm Index
from ahnlich_client_py.internals import db_query as query
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
response = client.drop_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[query.NonLinearAlgorithm__KDTree()],
error_if_not_exists=True,
tracing_id = None
)
Delete Key
from ahnlich_client_py.libs import create_store_key
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
store_key = create_store_key(data=[5.0, 3.0, 4.0, 3.9, 4.9])
response = client.delete_key(
store_name = "test store",
keys=[store_key]
)
Delete Predicate
from ahnlich_client_py.internals import db_query as query
from ahnlich_client_py import AhnlichDBClient
client = AhnlichDBClient(address="127.0.0.1", port=port)
condition = query.PredicateCondition__Value(
query.Predicate__Equals(
key="job",
value=query.MetadataValue__RawString(value="sorcerer")
)
)
response = client.delete_predicate(
store_name = "test store",
condition = condition
)
Requests - AI
Ping
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.ping(tracing_id)
Info Server
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.info_server(tracing_id)
List Stores
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
tracing_id = "00-80e1afed08e019fc1110464cfa66635c-7a085853722dc6d2-01"
response = client.list_stores(tracing_id)
Create Store
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.create_store(
store_name = "test store",
model = ai_query.AIModel__AllMiniLML6V2(),
store_type = ai_query.AIStoreType__RawString(),
predicates = [
"job"
],
non_linear_indices= [],
error_if_exists = True,
# Store original controls if we choose to store the raw inputs
# within the DB in order to be able to retrieve the originals again
# during query, else only store values are returned
store_original = True,
tracing_id=None,
)
Set
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
store_inputs = [
(
ai_query.StoreInput__RawString("Jordan One"),
{"brand": ai_query.MetadataValue__RawString("Nike")},
),
(
ai_query.StoreInput__RawString("Yeezey"),
{"brand": ai_query.MetadataValue__RawString("Adidas")},
),
]
response = client.set(
store_name = "test store",
inputs=store_inputs,
tracing_id=None
)
Drop store
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.drop_store(
store_name = "test store",
error_if_not_exists=True,
tracing_id=None
)
Get Sim N
Returns an array of (store_key, store_value) tuples, up to the specified maximum N.
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
search_input = ai_query.StoreInput__RawString("Jordan")
response = client.get_sim_n(
store_name = "test store",
search_input = search_input,
closest_n = 3,
algorithm = ai_query.Algorithm__CosineSimilarity(),
condition = None,
tracing_id=None
)
closest_n must be a nonzero integer.
Get By Predicate
Same as get_key, but returns results based on defined conditions.
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
condition = ai_query.PredicateCondition__Value(
ai_query.Predicate__Equals(
key="brand",
value=ai_query.MetadataValue__RawString(value="Nike")
)
)
response = client.get_by_predicate(
store_name = "test store",
condition=condition,
tracing_id=None,
)
Create Predicate Index
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.create_pred_index(
store_name = "test store",
predicates=["job", "rank"],
tracing_id=None,
)
Drop Predicate Index
from ahnlich_client_py import AhnlichAIClient
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.drop_pred_index(
store_name = "test store",
predicates=["job"],
error_if_not_exists=True,
tracing_id=None,
)
Create Non Linear Algorithm Index
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.create_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[ai_query.NonLinearAlgorithm__KDTree()],
tracing_id = None
)
Drop Non Linear Algorithm Index
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
response = client.drop_non_linear_algorithm_index(
store_name = "test store",
non_linear_indices=[ai_query.NonLinearAlgorithm__KDTree()],
error_if_not_exists=True,
tracing_id = None
)
Delete Key
from ahnlich_client_py import AhnlichAIClient
from ahnlich_client_py.internals import ai_query
client = AhnlichAIClient(address="127.0.0.1", port=port)
key = ai_query.StoreInput__RawString("Custom Made Jordan 4")
response = client.delete_key(
store_name = "test store",
keys=[key],
tracing_id=None
)
Bulk Requests
Clients can send multiple requests at once; the requests are handled sequentially. The builder class takes care of this, and the response is a list of the individual request responses.
from ahnlich_client_py import AhnlichDBClient, server_response
client = AhnlichDBClient(address="127.0.0.1", port=port)
request_builder = client.pipeline()
request_builder.ping()
request_builder.info_server()
request_builder.list_clients()
request_builder.list_stores()
response: server_response.ServerResult = client.exec()
The same pattern applies to the AI client, as sketched below.
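A sketch of the pipeline pattern with the AI client, assuming AhnlichAIClient exposes the same pipeline()/exec() interface as the DB client:
from ahnlich_client_py import AhnlichAIClient

ai_client = AhnlichAIClient(address="127.0.0.1", port=1370)  # use your ahnlich-ai port
request_builder = ai_client.pipeline()
request_builder.ping()
request_builder.info_server()
request_builder.list_stores()
# responses are returned in the order the requests were queued
response = ai_client.exec()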
Client As Context Manager
The DB and AI client classes can be used as context managers, closing the connection pool automatically when the context ends.
from ahnlich_client_py import AhnlichDBClient, server_response
with AhnlichDBClient(address="127.0.0.1", port=port) as db_client:
    response: server_response.ServerResult = db_client.ping()
Alternatively, the connection pool can be closed manually by calling cleanup() on the client, as sketched below.
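A minimal sketch of managing the pool manually with cleanup() instead of a context manager:
from ahnlich_client_py import AhnlichDBClient

client = AhnlichDBClient(address="127.0.0.1", port=1369)  # use your ahnlich-db port
try:
    response = client.ping()
finally:
    # release the pooled connections explicitly
    client.cleanup()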
Deploy to Artifactory
Replace the contents of the MSG_TAG file with your new tag message.
From your feature branch, either use the Makefile:
make bump-py-client BUMP_RULE=[major, minor, patch]
or
poetry run bumpversion [major, minor, patch]
When your PR is made, changes in the client version file will trigger a release build to PyPI.
Type Meanings
- Store Key: A one-dimensional vector
- Store Value: A dictionary containing text or binary values associated with a store key
- Store Predicates: Also known as predicate indices; indices that improve filtering of store_values
- Predicates: Operations that can be used to filter data (Equals, NotEquals, Contains, etc.)
- PredicateConditions: Conditions that use a single predicate or tie multiple predicates together using the AND, OR or Value operations, where Value wraps a single predicate. Example: Value
from ahnlich_client_py.internals import db_query

condition = db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="job", value=db_query.MetadataValue__RawString(value="sorcerer"))
)
MetadataValue can also be binary (a list of u8s):
condition = db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="image_data", value=db_query.MetadataValue__Image(value=[2,2,3,4,5,6,7]))
)
AND
# AND takes a tuple of predicate conditions
condition = db_query.PredicateCondition__AND(
(
db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="job", db_query.MetadataValue__RawString(value="sorcerer"))
),
db_query.PredicateCondition__Value(
db_query.Predicate__Equals(key="rank", value=db_query.MetadataValue__RawString(value="chunin"))
)
)
)
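OR
OR follows the same shape; a sketch assuming the generated variant is named PredicateCondition__OR, by analogy with the AND variant above:
# Or[tuple[predicate_conditions]] -- assumed class name, mirroring AND
condition = db_query.PredicateCondition__OR(
    (
        db_query.PredicateCondition__Value(
            db_query.Predicate__Equals(key="job", value=db_query.MetadataValue__RawString(value="sorcerer"))
        ),
        db_query.PredicateCondition__Value(
            db_query.Predicate__Equals(key="job", value=db_query.MetadataValue__RawString(value="ninja"))
        ),
    )
)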
- Search Input: A string or binary input that can be stored by the AI proxy. Note that support for binary input depends on the models used in a store and supported by Ahnlich AI (see the sketch after this list).
- AIModels: Supported AI models used by ahnlich ai
- AIStoreType: The type of store to be created; either Binary or String
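A hedged sketch of a binary search input: the variant name StoreInput__Image is an assumption by analogy with MetadataValue__Image above, and it is only meaningful when the store's model accepts binary/image input.
from ahnlich_client_py.internals import ai_query

# assumed binary variant of StoreInput, analogous to MetadataValue__Image
binary_input = ai_query.StoreInput__Image(value=[2, 2, 3, 4, 5, 6, 7])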
Change Log
Version | Description
---|---
0.0.0 | Base Python clients (async and sync) to connect to ahnlich db and AI, with connection pooling and bincode serialization and deserialization